Convergence for a class of delayed recurrent neural networks without M-matrix condition

Authors

Abstract


Similar resources

Event-Triggered State Estimation for a Class of Delayed Recurrent Neural Networks with Sampled-Data Information

... of stochastic nonlinear systems with multiple bounded time delays. In [31], the sampled-data synchronization control problem was addressed, where the sampling period was time-varying and switched between two different values in a random way. It is worth noting that most of the above results involved the traditional approach of sampling at prespecified time instants, w...

Full text

Global output convergence for delayed recurrent neural networks under impulsive effects

In this paper, we investigate the convergence of the state output for a class of delayed recurrent neural networks with impulsive effects. Based on properties of the time-varying inputs and the monotonicity of the activation function, we establish sufficient conditions that guarantee output convergence of networks whose state variables are subjected to impulsive displacements at fixed moments of time.

Full text

An efficient one-layer recurrent neural network for solving a class of nonsmooth optimization problems

Constrained optimization problems have a wide range of applications in science, economics, and engineering. In this paper, a neural network model is proposed to solve a class of nonsmooth constrained optimization problems with a nonsmooth convex objective function subject to nonlinear inequality and affine equality constraints. It is a one-layer non-penalty recurrent neural network based on the...

Full text

On the global output convergence of a class of recurrent neural networks with time-varying inputs

This paper studies the global output convergence of a class of recurrent neural networks with globally Lipschitz continuous and monotone nondecreasing activation functions and locally Lipschitz continuous time-varying inputs. We establish two sufficient conditions for global output convergence of this class of neural networks. Symmetry in the connection weight matrix is not required in the pres...

Full text
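The output-convergence setting described in the last abstract can be illustrated with a minimal simulation. This is a sketch only: the weight matrix, input signal, and step size below are illustrative assumptions, not taken from any of the papers listed. With a globally Lipschitz, monotone nondecreasing activation (tanh), weights small enough that the leakage term dominates, and a time-varying input that settles to a constant, trajectories started from different initial states produce outputs that converge to the same limit:

```python
import numpy as np

# Hedged sketch of output convergence for a recurrent network
#   x'(t) = -x(t) + W g(x(t)) + u(t),
# with g = tanh (globally Lipschitz, monotone nondecreasing) and an
# input u(t) that converges to a constant vector.
# W, u, T, and dt are illustrative choices, not from the papers above.

rng = np.random.default_rng(0)
n = 3
W = 0.2 * rng.standard_normal((n, n))  # small weights: -I term dominates
g = np.tanh

def simulate(T=50.0, dt=0.01):
    """Forward-Euler integration from a random initial state."""
    x = rng.standard_normal(n)
    for k in range(int(T / dt)):
        t = k * dt
        u = np.array([1.0, -0.5, 0.25]) * (1.0 + np.exp(-t))  # u(t) -> const
        x = x + dt * (-x + W @ g(x) + u)
    return g(x)  # the network output

out1 = simulate()
out2 = simulate()  # different random initial state, same input u(t)
# Outputs from the two initial states end up (numerically) identical,
# illustrating global output convergence under these assumptions.
print(np.linalg.norm(out1 - out2))
```

Because the Lipschitz constant of tanh is 1 and the spectral norm of this small W is well below 1, the difference between two trajectories contracts at a rate of roughly 1 - ||W||, so after T = 50 the output gap is negligible regardless of the starting states.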


Journal

Journal title: Journal of Computational and Applied Mathematics

سال: 2009

ISSN: 0377-0427

DOI: 10.1016/j.cam.2009.07.006